Further Stability Criterion on Delayed Recurrent Neural Networks Based on Reciprocal Convex Technique

Authors

  • Guobao Zhang
  • Tao Li
  • Shumin Fei
  • Zidong Wang
Abstract

Together with Lyapunov-Krasovskii functional theory and the reciprocal convex technique, a new sufficient condition is derived to guarantee the global stability of recurrent neural networks with both time-varying and continuously distributed delays, in which an improved delay-partitioning technique is employed. Unlike existing results, the LMI-based criterion depends on both the upper and lower bounds of the state delay and of its derivative, and is therefore applicable in more settings whenever the lower bound of the delay derivative is available. Finally, numerical examples illustrate that thinning the delay interval reduces the conservatism of the derived results.
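
For orientation, a sketch of the setting (illustrative only; the exact model, notation, and delay bounds used in the paper may differ). Recurrent neural networks with a time-varying delay and a continuously distributed delay are commonly written as

\[
\dot{x}(t) = -C x(t) + A f\big(x(t)\big) + B f\big(x(t-\tau(t))\big)
           + D \int_{t-\sigma}^{t} f\big(x(s)\big)\,\mathrm{d}s,
\qquad \tau_1 \le \tau(t) \le \tau_2,\quad \mu_1 \le \dot{\tau}(t) \le \mu_2,
\]

where C, A, B, D are weight matrices and f(\cdot) is the neuron activation function. The reciprocal convex technique mentioned in the abstract typically rests on the reciprocally convex combination lemma: for any \alpha \in (0,1), a matrix R \succ 0, and any matrix S making the block matrix below positive semidefinite,

\[
\frac{1}{\alpha}\, x^{\top} R\, x + \frac{1}{1-\alpha}\, y^{\top} R\, y
\;\ge\;
\begin{bmatrix} x \\ y \end{bmatrix}^{\top}
\begin{bmatrix} R & S \\ S^{\top} & R \end{bmatrix}
\begin{bmatrix} x \\ y \end{bmatrix},
\qquad
\begin{bmatrix} R & S \\ S^{\top} & R \end{bmatrix} \succeq 0 .
\]

In delay-partitioning arguments this lemma is typically used to bound the reciprocally weighted cross terms produced by Jensen's inequality on each delay subinterval, which is how an LMI-form criterion is obtained.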

Similar Articles

Combined Convex Technique on Delay-Distribution-Dependent Stability for Delayed Neural Networks

Together with the Lyapunov-Krasovskii functional approach and an improved delay-partitioning idea, a novel sufficient condition is derived to guarantee that a class of delayed neural networks is asymptotically stable in the mean-square sense, in which the probabilistic variable delay and both delay variation limits can be measured. Through combining the reciprocal convex technique and convex...

Robust stability of stochastic fuzzy impulsive recurrent neural networks with time-varying delays

In this paper, global robust stability of stochastic impulsive recurrent neural networks with time-varying delays, represented by Takagi-Sugeno (T-S) fuzzy models, is considered. A novel linear matrix inequality (LMI)-based stability criterion is obtained by using Lyapunov functional theory to guarantee the asymptotic stability of uncertain fuzzy stochastic impulsive recurrent neural...

Improved Exponential Stability Analysis for Delayed Recurrent Neural Networks

This paper studies the problem of exponential stability analysis for recurrent neural networks with time-varying delay. By establishing a suitable augmented Lyapunov-Krasovskii functional, a novel sufficient condition is obtained to guarantee the exponential stability of the considered system. In order to obtain a less conservative condition, zero equalities and the reciprocally convex ap...

A Recurrent Neural Network for Solving Strictly Convex Quadratic Programming Problems

In this paper we present an improved neural network to solve the strictly convex quadratic programming (QP) problem. The proposed model is derived from a piecewise equation corresponding to the optimality conditions of the convex QP problem and has lower structural complexity than other existing neural network models for solving such problems. On the theoretical side, stability and global converge...

Further results on passivity analysis of delayed cellular neural networks

The passivity condition for delayed neural networks with uncertainties is considered in this article. By a simple extension of a recent work on stability analysis of the system, a new criterion for the passivity of the system is derived in terms of linear matrix inequalities (LMIs), which can be easily solved by various convex optimization algorithms. A numerical example is given to show...

Journal:

Volume   Issue

Pages  -

Publication date: 2014